Optimized distance metrics for differential evolution based nearest prototype classifier
Authors
Abstract
In this article, we introduce a differential evolution based classifier extended to automatically select the applied distance measure from a predefined pool of alternative distance measures, so as to suit optimally the particular data set at hand. The proposed method extends the earlier differential evolution based nearest prototype classifier by optimizing not only the required parameters of the distance measures, but also the selection of the distance measure itself, in order to find the best possible distance measure for the particular data set at hand. It has been clear for some time that in classification the usual Euclidean distance is often not the best choice, and that the optimal distance measure depends on the particular properties of the data set to be classified. So far, this issue has received only limited attention in the literature. Where the problem has been considered at all, typically only a couple of distance measures have been tested to find which one fits the data at hand best. In this paper we take one step further by applying a systematic global optimization approach to selecting the best distance measure from a set of alternatives, so as to obtain the highest classification accuracy for the given data. In particular, we have generated a pool of distance measures for this purpose and developed a model of how the differential evolution based classifier can be extended to optimize the selection of the distance measure for the given data. The obtained results demonstrate, and further confirm earlier findings reported in the literature, that some distance measure other than the commonly used Euclidean distance is often the best choice. The selection of the distance measure is one of the most important factors for obtaining the best classification accuracy, and should therefore be emphasized more in future research.
The results also indicate that it is possible to build a classifier that selects the optimal distance measure for the given data automatically. They further suggest that the proposed extension of the differential evolution based classifier is a clearly efficient alternative for solving classification problems.
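The core idea described above, encoding both the prototype positions and the choice of distance measure in the vector optimized by differential evolution, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the three-measure pool, the data set, and all function names are assumptions, and `scipy.optimize.differential_evolution` stands in for the DE variant used in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Illustrative pool of alternative distance measures (not the paper's exact pool)
DISTANCES = [
    lambda a, b: np.linalg.norm(a - b, ord=2, axis=-1),  # Euclidean
    lambda a, b: np.abs(a - b).sum(axis=-1),             # Manhattan
    lambda a, b: np.abs(a - b).max(axis=-1),             # Chebyshev
]

def decode(vec, n_classes, n_features):
    """Split a DE candidate into class prototypes and a distance-measure index."""
    protos = vec[:-1].reshape(n_classes, n_features)
    # The last gene is a real value mapped onto an index into the measure pool
    d_idx = int(np.clip(vec[-1], 0, len(DISTANCES) - 1e-9))
    return protos, d_idx

def error_rate(vec, X, y, n_classes):
    """Fitness: training misclassification rate of the nearest prototype rule."""
    protos, d_idx = decode(vec, n_classes, X.shape[1])
    dist = DISTANCES[d_idx]
    # Distance of every sample to every prototype; predict the nearest one's class
    d = np.stack([dist(X, p) for p in protos], axis=1)
    pred = d.argmin(axis=1)
    return np.mean(pred != y)

# Tiny synthetic two-class data set (hypothetical, for demonstration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(2, 0.3, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

n_classes, n_features = 2, 2
# Bounds: one gene per prototype coordinate, plus one gene selecting the measure
bounds = [(-1, 3)] * (n_classes * n_features) + [(0, len(DISTANCES))]
result = differential_evolution(error_rate, bounds, args=(X, y, n_classes),
                                seed=1, maxiter=50)
protos, d_idx = decode(result.x, n_classes, n_features)
print("selected distance index:", d_idx, "training error:", result.fun)
```

Because the measure index is just one more gene in the candidate vector, DE jointly searches over prototype placement and the distance pool, which is what lets the classifier adapt the metric to the data set at hand.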
Similar references
Differential evolution based nearest prototype classifier with optimized distance measures for the features in the data sets
In this paper, a further generalization of the differential evolution based data classification method is proposed, demonstrated, and initially evaluated. The differential evolution classifier is a nearest prototype vector based classifier that applies a global optimization algorithm, differential evolution, for determining the optimal values for all free parameters of the classifier model during the...
Multi-hypothesis nearest-neighbor classifier based on class-conditional weighted distance metric
The performance of nearest-neighbor (NN) classifiers is known to be very sensitive to the distance metric used in classifying a query pattern, especially in scarce-prototype cases. In this paper, a class-conditional weighted (CCW) distance metric related to both the class labels of the prototypes and the query patterns is proposed. Compared with the existing distance metrics, the proposed metric...
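The notion of a class-conditional weighted distance, where each class's prototype carries its own per-feature weighting, can be sketched in a few lines. The prototypes, weights, and function names below are hypothetical illustrations of the general idea, not the CCW metric defined in that paper.

```python
import numpy as np

def weighted_distance(x, prototype, weights):
    """Weighted Euclidean distance; the weight vector depends on the class."""
    return np.sqrt(np.sum(weights * (x - prototype) ** 2))

# Hypothetical example: two class prototypes, each with its own feature weighting
protos  = {0: np.array([0.0, 0.0]), 1: np.array([2.0, 2.0])}
weights = {0: np.array([1.0, 0.1]), 1: np.array([0.1, 1.0])}

def classify(x):
    # Nearest prototype rule under the class-conditional weighted distance
    return min(protos, key=lambda c: weighted_distance(x, protos[c], weights[c]))

print(classify(np.array([0.2, 1.5])))  # class 0: its weights discount feature 2
```

Because each class down-weights the features that are noisy for it, a query can be assigned differently than under a single shared Euclidean metric.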
Differential Evolution Based Multiple Vector Prototype Classifier
In this article we introduce a differential evolution based multiple vector prototype classifier (MVDE for short). In this method we extend the previous DE classifier so that it can handle several class vectors in one class. Classification problems that are so complex that they are simply not separable using distance based algorithms, e.g. the differential evolution (DE) classifier or support vector...
Integrating a differential evolution feature weighting scheme into prototype generation
Prototype generation techniques have arisen as very competitive methods for enhancing the nearest neighbor classifier through data reduction. Within the prototype generation methodology, methods that adjust the prototypes' positioning have shown outstanding performance. Evolutionary algorithms have been used to optimize the positioning of the prototypes with promising results. Prototype...
Regularized margin-based conditional log-likelihood loss for prototype learning
The classification performance of nearest prototype classifiers largely relies on the prototype learning algorithm. The minimum classification error (MCE) method and the soft nearest prototype classifier (SNPC) method are two important algorithms using misclassification loss. This paper proposes a new prototype learning algorithm based on the conditional log-likelihood loss (CLL), which is base...
Journal: Expert Syst. Appl.
Volume 39, Issue -
Pages -
Publication date: 2012